#Database Integration Services
Oracle Database Consulting Services to Power Your Enterprise
When your business relies on data to thrive, you can’t afford performance issues, security vulnerabilities, or downtime. That’s where Oracle Database Consulting Services come into play. At RalanTech, we offer end-to-end database consulting to help enterprises manage, optimize, and secure their Oracle environments.

Our experts work with clients across industries, helping them achieve higher uptime, faster performance, and scalable architecture tailored to business goals. Whether you're undergoing a digital transformation or need to enhance your legacy systems, our database consultants provide actionable insights and solutions.
We recognize that managing Oracle databases requires not just technical know-how, but a strategic approach to data lifecycle management. That’s why our consulting services are tailored to align with your business goals, budget, and compliance requirements.
Let’s explore what makes RalanTech a leading provider of DBA consulting services and why so many businesses in the U.S. trust us to manage their critical data infrastructure.
Why Choose RalanTech for Oracle Database Consulting Services?
Proven Expertise Across Industries
At RalanTech, our consultants bring years of experience working with Fortune 500 companies, startups, and government agencies. We offer specialized expertise in Oracle Database architecture, performance tuning, patching, upgrades, and cloud migrations.
Customized Solutions
No two businesses have the same database challenges. Our team tailors every database management consulting plan to meet your specific operational requirements—whether that means 24/7 monitoring, periodic health checks, or advanced disaster recovery planning.
End-to-End Database Services
From design and deployment to support and optimization, we cover all aspects of Oracle database management. Clients benefit from one seamless engagement instead of juggling multiple vendors.
ROI-Focused Consulting
We understand that database investments must deliver measurable returns. Our consultants focus on solutions that improve operational efficiency, reduce licensing costs, and increase system uptime, ultimately driving ROI.
Comprehensive Database Management Consulting You Can Trust
Strategic Planning and Assessment
Before any database enhancement or overhaul begins, RalanTech conducts a thorough assessment of your current environment. We analyze capacity planning, schema architecture, and workload balancing to build a robust optimization strategy.
Migration and Upgrades
Whether you're moving to Oracle 19c or transitioning your workloads to the cloud, our DBA consulting services ensure zero data loss, minimal downtime, and full compliance with regulatory standards.
Performance Tuning
Slow databases can cripple business operations. Our Oracle consultants use real-time monitoring tools and query optimization techniques to dramatically improve performance.
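The trade-off behind index-based tuning can be sketched in plain Python (illustrative only — not RalanTech tooling or Oracle internals; the table and lookups are toy stand-ins):

```python
# Illustrative only: a hash "index" versus a full scan -- the same trade-off
# a database index exploits. 100,000 in-memory rows stand in for a table.
rows = [{"id": i, "name": f"user{i}"} for i in range(100_000)]

def full_scan(target_id):
    """Unindexed lookup: examine every row until a match is found (O(n))."""
    return next(r for r in rows if r["id"] == target_id)

# Building the index costs one pass over the data; afterwards each lookup
# is a single hash probe (O(1)) instead of a scan.
index = {r["id"]: r for r in rows}

def indexed_lookup(target_id):
    return index[target_id]
```

Both functions return the same row; the difference shows up in how many rows must be touched per query, which is exactly what an execution plan reveals.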
Backup and Disaster Recovery
Disaster can strike anytime. We set up automated, reliable backup and recovery systems that minimize data loss and ensure business continuity even in the worst-case scenarios.
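The retention side of an automated backup scheme can be sketched as follows (a hypothetical 7-daily/4-weekly policy for illustration, not RalanTech's actual backup design):

```python
from datetime import date, timedelta

def backups_to_keep(today, daily=7, weekly=4):
    """Return the set of dates a simple 7-daily / 4-weekly retention policy keeps."""
    # Keep every backup from the last `daily` days.
    keep = {today - timedelta(days=i) for i in range(daily)}
    # Additionally keep one backup per week (Mondays here) for `weekly` weeks.
    monday = today - timedelta(days=today.weekday())
    keep |= {monday - timedelta(weeks=w) for w in range(weekly)}
    return keep
```

A real disaster-recovery plan layers this kind of rotation with off-site copies and regular restore tests; the sketch only shows how a retention window is computed.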
DBA Consulting Services That Scale With You
24/7 Monitoring and Support
Downtime is costly. That’s why our Oracle-certified DBAs provide continuous monitoring and on-demand support to catch and resolve issues before they impact operations.
Security and Compliance
From HIPAA to SOX, our database management consulting experts ensure your Oracle systems adhere to the highest security standards. We implement access controls, encryption, and activity auditing as part of our core offering.
Cloud Integration and Hybrid Environments
We assist businesses in moving their databases to Oracle Cloud Infrastructure (OCI), Amazon RDS, or hybrid setups. Our team ensures secure, seamless integration with existing applications and infrastructure.
Capacity Planning and Future-Proofing
RalanTech consultants analyze current workloads and predict future demands, allowing your organization to scale without compromising performance or reliability.
Key Benefits of Our Oracle Database Consulting Services
Increased Performance and Speed
Our Oracle specialists fine-tune your database for optimized indexing, query execution plans, and memory usage, ensuring peak performance at all times.
Reduced Operational Costs
With RalanTech, you get proactive support, minimized downtime, and optimized licensing—all of which result in significant cost savings.
Improved Data Integrity
We enforce strong data governance policies and ensure that all database operations are aligned with compliance requirements and internal policies.
Greater Business Agility
Our DBA consulting services empower businesses to deploy new applications, adapt to market demands, and scale operations without database bottlenecks.
How RalanTech Stands Apart from Other Database Consulting Firms
Agile Methodology
We follow an agile consulting approach that allows us to deliver incremental value at each project stage. Our clients stay in the loop from day one to go-live.
Certified Oracle Experts
All our consultants are Oracle Certified Professionals (OCP) with years of experience in complex environments. You're not just hiring a consultant—you're hiring a strategic partner.
Dedicated Account Management
RalanTech assigns a dedicated account manager to every project, ensuring personalized service and rapid response times.
Post-Implementation Support
We don’t disappear after go-live. Our ongoing support services ensure that your systems continue to run smoothly, securely, and efficiently.
Use Cases for Oracle Database Consulting Services
Financial Institutions
Banks and financial institutions rely heavily on Oracle for core transaction processing. Our services ensure they meet stringent performance and compliance demands.
Healthcare Providers
Patient data is sensitive and highly regulated. RalanTech delivers secure Oracle environments that meet HIPAA and HITECH compliance.
E-commerce Platforms
Online businesses require lightning-fast response times. We help optimize Oracle systems for maximum availability and conversion.
Government and Public Sector
We assist agencies in maintaining secure, compliant, and high-performing Oracle environments with robust disaster recovery plans.
Oracle DBA Consulting Services for the Cloud Era
Oracle Cloud Infrastructure (OCI) Optimization
We specialize in deploying and optimizing Oracle databases in OCI. Our consultants help design cloud-native solutions that maximize scalability and minimize cost.
Multi-Cloud and Hybrid Architectures
RalanTech helps businesses run Oracle databases across AWS, Azure, and on-prem setups. We handle the configuration, monitoring, and performance tuning.
Licensing and Cost Management
Oracle licensing can be complex and costly. Our experts help optimize licensing structures, avoid unnecessary purchases, and stay compliant with Oracle policies.
DevOps and Automation
We incorporate DevOps methodologies to automate Oracle database deployment, testing, and scaling—resulting in faster time-to-market and reduced errors.
How to Get Started with RalanTech
Free Initial Consultation
We offer a free 30-minute consultation to understand your current challenges and goals. This helps us create a roadmap tailored to your needs.
Clear Project Milestones
Every project is broken down into milestones, ensuring transparency, accountability, and timely delivery.
Flexible Engagement Models
Choose from on-demand, project-based, or managed service contracts. We adapt our delivery model to your business requirements.
Proven Track Record
Our case studies and testimonials speak for themselves. Clients report increased reliability, reduced costs, and improved user satisfaction after partnering with RalanTech.
FAQs – Oracle Database Consulting Services
Q1: What industries does RalanTech support with Oracle database consulting?
We serve a wide range of industries including finance, healthcare, retail, government, and manufacturing—any sector that depends on secure and high-performance database systems.
Q2: Can RalanTech help migrate our existing database to Oracle Cloud?
Absolutely. Our team specializes in Oracle Cloud Infrastructure (OCI) migration, ensuring minimal disruption and complete data integrity.
Q3: Do you offer 24/7 DBA support?
Yes, we offer round-the-clock monitoring and support as part of our managed DBA consulting services.
Q4: How do you ensure compliance with data regulations?
We follow industry best practices and implement data governance frameworks that adhere to HIPAA, SOX, GDPR, and other compliance mandates.
Q5: What sets your database management consulting apart?
Our personalized approach, certified Oracle experts, and proven track record in delivering high-performance solutions make us a trusted partner.
Q6: Do you support hybrid and multi-cloud database environments?
Yes, we help clients manage Oracle databases across on-prem, cloud, and hybrid infrastructures including AWS, Azure, and OCI.
Q7: Is there a minimum contract period for DBA consulting services?
We offer flexible engagement models including short-term and long-term contracts to suit your needs.
Q8: What tools do you use for performance monitoring?
We leverage industry-leading tools like Oracle Enterprise Manager (OEM), SolarWinds, and custom dashboards for real-time performance insights.
Q9: Can RalanTech help reduce Oracle licensing costs?
Yes, we analyze your current licensing and recommend optimizations that ensure compliance while reducing unnecessary expenditure.
Q10: How do I get started?
Just reach out via https://www.ralantech.com or book your free consultation today. Our Oracle experts are ready to assist.
#database consulting projects#database consultancy solutions#Database Integration Services#Database replication#Database clustering#Data Partitioning#Database mirroring
IT integration
Discover how Database Performance Tuning & Optimization Services by Simple Logic can boost your database efficiency. Learn how optimizing query performance, enhancing data retrieval, and reducing system lag can improve user experience and reduce costs. Perfect for businesses seeking faster data processing and streamlined operations! 📊⚙️🚀
#simplelogic#makingitsimple#simplelogicit#database#database management#database integration#database migration services
Need a reliable freelance back-end developer? I offer expert services in building scalable APIs, managing databases, and optimizing server-side performance. With a focus on tailored solutions and efficient code, I'll help bring your project to life. Contact me for top-notch development!
#website design#freelance back end developer#api integration#backenddevelopment#database management#back end services
Professional MySQL Development Services
Professional MySQL development services to support your database management needs. Our skilled developers design and implement robust MySQL solutions that ensure data integrity, security, and performance. Trust us to deliver reliable and efficient database services for your business.
#artificial intelligence#web development#tech#technology#data integrity#data security#database services
Vistabyte Innovations - Database Management Company in India
Founded in January 2024, Vistabyte Innovations understands that database maintenance demands a proven track record of excellence and innovation. We offer Database Administrator (DBA) certified experts who manage and monitor database availability, functionality, and performance, covering any issues that may arise.
We are not just a technology company but a dedicated ally to your data. With the simple mission of ensuring your databases run flawlessly, our team focuses on what matters most – growing your business.
At VistaByte Innovations, we recognize the vital role a well-maintained database plays in the smooth operation of your applications. We believe that a strong and efficient database system is essential for any successful business. That’s why we provide 24/7 support for your databases, ensuring they always perform at their best, no matter the user load or data volume. Our team of highly skilled professionals offers proactive monitoring, performance tuning, and timely troubleshooting to guarantee uninterrupted access to your valuable data. We don’t just offer comprehensive support for popular database management systems; we customize our solutions to fit your unique business needs. Whether it’s installation, configuration, ongoing maintenance, or complex data migrations, our team is here to assist you every step of the way.
Advanced WordPress Caching
Speed is important in the fast-paced world of online presence. Users expect websites to load in the blink of an eye, and search engines reward faster sites with better rankings. If you’re a WordPress enthusiast, you probably already know how crucial caching is to maximising your website’s performance. In this blog post, we’ll explore sophisticated WordPress caching strategies that can significantly increase the speed of your site.
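The core page-caching idea — serve a stored copy instead of re-rendering, and expire it after a while — can be sketched in a few lines (illustrative Python, not a WordPress plugin):

```python
import time

class TTLCache:
    """Minimal page-cache sketch: store rendered pages, expire after ttl seconds."""
    def __init__(self, ttl=60.0):
        self.ttl = ttl
        self.store = {}  # key -> (expiry_timestamp, value)

    def get(self, key):
        entry = self.store.get(key)
        if entry is None or entry[0] < time.monotonic():
            return None  # cache miss, or entry expired
        return entry[1]

    def set(self, key, value):
        self.store[key] = (time.monotonic() + self.ttl, value)

def render_page(slug):
    """Stand-in for an expensive render (database queries, templating)."""
    return f"<html>{slug}</html>"

cache = TTLCache(ttl=300)

def serve(slug):
    page = cache.get(slug)
    if page is None:          # miss: render once, then store the result
        page = render_page(slug)
        cache.set(slug, page)
    return page
```

Object caching, opcode caching, and CDN caching apply the same hit/miss/expiry logic at different layers of the stack.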
#WordPress Caching#Advanced Caching Techniques#Browser Caching#Browser Caching in WordPress#CDN Integration for WordPress Caching#Fragment Caching#JNext#JNext Services#Object Caching in WordPress#Opcode Caching for WordPress#Page Caching Best Practices#Performance Optimization with WordPress Caching#Redis Cache for WordPress#Server-Side Caching in WordPress#Speeding up WordPress Sites#Tuning Database Caching in WordPress#WordPress Cache Management Tips
"Efficient Time Tracking and Labor Management: Rapidflow Inc's Oracle Solutions"
Oracle Time and Labor Management is a comprehensive solution provided by Oracle Corporation to help organizations manage and optimize workforce time tracking and attendance. This system is designed to streamline and automate various aspects of time and labor management, enhancing efficiency and accuracy in tracking employee time.
For more details, visit us at https://www.rapidflowapps.com/biometric-device-and-oracle-time-and-labour-integration/
Key features and functionalities of Oracle Time and Labor Management may include:
Time Tracking:
Enables employees to log their work hours accurately.
Supports various time entry methods, such as web-based entry, mobile apps, and time clocks.
Integration with other Oracle applications, such as Oracle Human Resources and Oracle Payroll, for seamless data flow.
Attendance Management:
Tracks employee attendance, including absences, tardiness, and leaves.
Generates reports on attendance patterns for individual employees or entire teams.
Scheduling:
Helps in creating and managing employee schedules.
Optimizes schedules based on workload, employee availability, and business rules.
Compliance and Policy Enforcement:
Ensures compliance with labor laws and company policies.
Enforces rules related to overtime, breaks, and other labor regulations.
Reporting and Analytics:
Provides robust reporting and analytics capabilities to analyze workforce data.
Enables data-driven decision-making for workforce management.
Integration:
Integrates seamlessly with other Oracle applications and third-party systems.
Supports data exchange between Time and Labor Management and other HR and payroll systems.
Mobile Accessibility:
Allows employees and managers to access the system and perform necessary tasks through mobile devices.
Configurability:
Offers customization options to adapt to the specific needs and workflows of different organizations.
Audit Trails:
Maintains detailed audit trails for all time-related transactions, ensuring data integrity and compliance.
User-Friendly Interface:
Provides an intuitive and user-friendly interface for easy adoption by employees and managers.
Oracle Time and Labor Management is part of Oracle's broader suite of Human Capital Management (HCM) solutions, contributing to the efficient management of workforce resources and enhancing organizational productivity. Organizations can leverage this system to improve accuracy in payroll processing, monitor labor costs, and comply with regulatory requirements.
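The overtime-rule enforcement described above can be illustrated with a toy pay calculation (a common single-threshold, time-and-a-half rule is assumed here for illustration; this is not Oracle's actual rules engine, whose policies are configurable per organization):

```python
def weekly_pay(hours, rate, ot_threshold=40, ot_multiplier=1.5):
    """Split weekly hours into regular and overtime pay under a single-threshold rule."""
    regular = min(hours, ot_threshold)          # hours paid at the base rate
    overtime = max(hours - ot_threshold, 0)     # hours paid at the premium rate
    return regular * rate + overtime * rate * ot_multiplier

# 45 hours at $20/h: 40 * 20 + 5 * 20 * 1.5 = 800 + 150 = 950
```

Real labor-compliance engines layer many such rules (daily thresholds, breaks, jurisdiction-specific regulations) and audit every adjustment.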
#database services#hcm services#hr and payroll services#integration services#oracle applications#rapidflow inc services#time and labor management#time and labor management services#time tracking#workforce resources
Software Development Company in Dwarka
Our software development company in Dwarka is a leading provider of innovative and high-quality software solutions. We specialize in developing custom software solutions for businesses of all sizes, from startups to large enterprises. Our team of experienced software developers and designers work closely with our clients to understand their unique needs and requirements, and then create custom software solutions that meet their specific needs. We use the latest technologies and tools to ensure that our software solutions are efficient, scalable, and secure. Our company is committed to delivering exceptional customer service and support, and we pride ourselves on our ability to exceed our clients' expectations.
#software development#consulting#web development#mobile development#database management#xcrino#system integration#business solutions#IT services#software development company#software development firm#software development agency#software development team
Generative AI Policy (February 9, 2024)
As of February 9, 2024, we are updating our Terms of Service to prohibit the following content:
Images created through the use of generative AI programs such as Stable Diffusion, Midjourney, and Dall-E.
This post explains what that means for you. We know it’s impossible to remove all images created by Generative AI on Pillowfort. The goal of this new policy, however, is to send a clear message that we are against the normalization of commercializing and distributing images created by Generative AI. Pillowfort stands in full support of all creatives who make Pillowfort their home. Disclaimer: The following policy was shaped in collaboration with Pillowfort Staff and international university researchers. We are aware that Artificial Intelligence is a rapidly evolving environment. This policy may require revisions in the future to adapt to the changing landscape of Generative AI.
-
Why is Generative AI Banned on Pillowfort?
Our Terms of Service already prohibits copyright violations, which includes reposting other people’s artwork to Pillowfort without the artist’s permission; and because of how Generative AI draws on a database of images and text that were taken without consent from artists or writers, all Generative AI content can be considered in violation of this rule. We also had an overwhelming response from our user base urging us to take action on prohibiting Generative AI on our platform.
-
How does Pillowfort define Generative AI?
As of February 9, 2024, we define Generative AI as online tools for producing material based on large data collections that are often gathered without consent or notification from the original creators.
Generative AI tools do not require skill on behalf of the user and effectively replace them in the creative process (ie - little direction or decision making taken directly from the user). Tools that assist creativity don't replace the user. This means the user can still improve their skills and refine over time.
For example: If you ask a Generative AI tool to add a lighthouse to an image, the image of a lighthouse appears in a completed state. Whereas if you used an assistive drawing tool to add a lighthouse to an image, the user decides the tools used to contribute to the creation process and how to apply them.
Examples of Tools Not Allowed on Pillowfort:
Adobe Firefly*
Dall-E
GPT-4
Jasper Chat
Lensa
Midjourney
Stable Diffusion
Synthesia
Examples of Tools Still Allowed on Pillowfort:
AI Assistant Tools (ie: Google Translate, Grammarly)
VTuber Tools (ie: Live3D, Restream, VRChat)
Digital Audio Editors (ie: Audacity, Garage Band)
Poser & Reference Tools (ie: Poser, Blender)
Graphic & Image Editors (ie: Canva, Adobe Photoshop*, Procreate, Medibang, automatic filters from phone cameras)
*While Adobe software such as Adobe Photoshop is not considered Generative AI, Adobe Firefly is fully integrated in various Adobe software and falls under our definition of Generative AI. The use of Adobe Photoshop is allowed on Pillowfort. The creation of an image in Adobe Photoshop using Adobe Firefly would be prohibited on Pillowfort.
-
Can I use ethical generators?
Due to the evolving nature of Generative AI, ethical generators are not an exception.
-
Can I still talk about AI?
Yes! Posts, Comments, and User Communities discussing AI are still allowed on Pillowfort.
-
Can I link to or embed websites, articles, or social media posts containing Generative AI?
Yes. We do ask that you properly tag your post as “AI” and “Artificial Intelligence.”
-
Can I advertise the sale of digital or virtual goods containing Generative AI?
No. Offsite Advertising of the sale of goods (digital and physical) containing Generative AI on Pillowfort is prohibited.
-
How can I tell if a software I use contains Generative AI?
A general rule of thumb as a first step is you can try testing the software by turning off internet access and seeing if the tool still works. If the software says it needs to be online there’s a chance it’s using Generative AI and needs to be explored further.
You are also always welcome to contact us at [email protected] if you’re still unsure.
-
How will this policy be enforced/detected?
Our Team has decided we are NOT using AI-based automated detection tools due to how often they produce false positives and other issues. Instead, we are applying a suite of methods sourced from international university researchers for moderating material potentially produced by Generative AI.
-
How do I report content containing Generative AI Material?
If you are concerned about post(s) featuring Generative AI material, please flag the post for our Site Moderation Team to conduct a thorough investigation. As a reminder, Pillowfort’s existing policy regarding callout posts applies here and harassment / brigading / etc will not be tolerated.
Any questions or clarifications regarding our Generative AI Policy can be sent to [email protected].
The so-called Department of Government Efficiency (DOGE) is starting to put together a team to migrate the Social Security Administration’s (SSA) computer systems entirely off one of its oldest programming languages in a matter of months, potentially putting the integrity of the system—and the benefits on which tens of millions of Americans rely—at risk.
The project is being organized by Elon Musk lieutenant Steve Davis, multiple sources who were not given permission to talk to the media tell WIRED, and aims to migrate all SSA systems off COBOL, one of the first common business-oriented programming languages, and onto a more modern replacement like Java within a scheduled tight timeframe of a few months.
Under any circumstances, a migration of this size and scale would be a massive undertaking, experts tell WIRED, but the expedited deadline runs the risk of obstructing payments to the more than 65 million people in the US currently receiving Social Security benefits.
“Of course, one of the big risks is not underpayment or overpayment per se; [it’s also] not paying someone at all and not knowing about it. The invisible errors and omissions,” an SSA technologist tells WIRED.
The Social Security Administration did not immediately reply to WIRED’s request for comment.
SSA has been under increasing scrutiny from president Donald Trump’s administration. In February, Musk took aim at SSA, falsely claiming that the agency was rife with fraud. Specifically, Musk pointed to data he allegedly pulled from the system that showed 150-year-olds in the US were receiving benefits, something that isn’t actually happening. Over the last few weeks, following significant cuts to the agency by DOGE, SSA has suffered frequent website crashes and long wait times over the phone, The Washington Post reported this week.
This proposed migration isn’t the first time SSA has tried to move away from COBOL: In 2017, SSA announced a plan to receive hundreds of millions in funding to replace its core systems. The agency predicted that it would take around five years to modernize these systems. Because of the coronavirus pandemic in 2020, the agency pivoted away from this work to focus on more public-facing projects.
Like many legacy government IT systems, SSA systems contain code written in COBOL, a programming language created in part in the 1950s by computing pioneer Grace Hopper. The Defense Department essentially pressured private industry to use COBOL soon after its creation, spurring widespread adoption and making it one of the most widely used languages for mainframes, or computer systems that process and store large amounts of data quickly, by the 1970s. (At least one DOD-related website praising Hopper's accomplishments is no longer active, likely following the Trump administration’s DEI purge of military acknowledgements.)
As recently as 2016, SSA’s infrastructure contained more than 60 million lines of code written in COBOL, with millions more written in other legacy coding languages, the agency’s Office of the Inspector General found. In fact, SSA’s core programmatic systems and architecture haven’t been “substantially” updated since the 1980s when the agency developed its own database system called MADAM, or the Master Data Access Method, which was written in COBOL and Assembler, according to SSA’s 2017 modernization plan.
SSA’s core “logic” is also written largely in COBOL. This is the code that issues social security numbers, manages payments, and even calculates the total amount beneficiaries should receive for different services, a former senior SSA technologist who worked in the office of the chief information officer says. Even minor changes could result in cascading failures across programs.
“If you weren't worried about a whole bunch of people not getting benefits or getting the wrong benefits, or getting the wrong entitlements, or having to wait ages, then sure go ahead,” says Dan Hon, principal of Very Little Gravitas, a technology strategy consultancy that helps government modernize services, about completing such a migration in a short timeframe.
It’s unclear when exactly the code migration would start. A recent document circulated amongst SSA staff laying out the agency’s priorities through May does not mention it, instead naming other priorities like terminating “non-essential contracts” and adopting artificial intelligence to “augment” administrative and technical writing.
Earlier this month, WIRED reported that at least 10 DOGE operatives were currently working within SSA, including a number of young and inexperienced engineers like Luke Farritor and Ethan Shaotran. At the time, sources told WIRED that the DOGE operatives would focus on how people identify themselves to access their benefits online.
Sources within SSA expect the project to begin in earnest once DOGE identifies and marks remaining beneficiaries as deceased and connecting disparate agency databases. In a Thursday morning court filing, an affidavit from SSA acting administrator Leland Dudek said that at least two DOGE operatives are currently working on a project formally called the “Are You Alive Project,” targeting what these operatives believe to be improper payments and fraud within the agency’s system by calling individual beneficiaries. The agency is currently battling for sweeping access to SSA’s systems in court to finish this work. (Again, 150-year-olds are not collecting social security benefits. That specific age was likely a quirk of COBOL. It doesn’t include a date type, so dates are often coded to a specific reference point—May 20, 1875, the date of an international standards-setting conference held in Paris, known as the Convention du Mètre.)
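The "150-year-old" quirk the article describes can be reproduced in a few lines. The sketch below assumes a numeric date field stored as days since the 1875-05-20 reference point mentioned above; the field layout is hypothetical, not SSA's actual schema:

```python
from datetime import date, timedelta

# Reference point cited in the article: May 20, 1875 (Convention du Metre).
EPOCH = date(1875, 5, 20)

def decode_birthdate(days_since_epoch):
    """Decode a numeric date field stored as days since the 1875 epoch."""
    return EPOCH + timedelta(days=days_since_epoch)

# A missing or zeroed value silently decodes to the epoch itself, so the
# record appears to belong to someone born in 1875:
ghost = decode_birthdate(0)          # 1875-05-20
age_in_2025 = 2025 - ghost.year      # 150 -- the "150-year-old beneficiaries"
```

Nothing in the data is "fraud" here; a sentinel value for *unknown* was read as a real date, which is exactly the kind of invisible error a rushed migration multiplies.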
In order to migrate all COBOL code into a more modern language within a few months, DOGE would likely need to employ some form of generative artificial intelligence to help translate the millions of lines of code, sources tell WIRED. “DOGE thinks if they can say they got rid of all the COBOL in months, then their way is the right way, and we all just suck for not breaking shit,” says the SSA technologist.
DOGE would also need to develop tests to ensure the new system’s outputs match the previous one. It would be difficult to resolve all of the possible edge cases over the course of several years, let alone months, adds the SSA technologist.
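One standard way to build such tests is differential testing: feed the same inputs to the legacy and rewritten systems and flag any divergence. A minimal sketch, with hypothetical benefit calculators standing in for the COBOL and Java versions:

```python
def differential_test(legacy_fn, new_fn, cases):
    """Run identical inputs through both implementations; collect mismatches."""
    mismatches = []
    for case in cases:
        old, new = legacy_fn(case), new_fn(case)
        if old != new:
            mismatches.append((case, old, new))
    return mismatches

# Hypothetical calculators -- NOT real SSA logic.
def legacy_benefit(earnings):
    return round(earnings * 0.9, 2)

def new_benefit(earnings):
    # Deliberate edge-case bug at the boundary, to show what the harness catches.
    return round(earnings * 0.9, 2) if earnings < 10_000 else 0

bad = differential_test(legacy_benefit, new_benefit, [100, 5_000, 10_000])
# flags only the 10_000 case, where the rewrite diverges
```

The hard part, as the technologist notes, is enumerating the cases: decades of accumulated edge conditions mean the input space is vastly larger than any quickly assembled test suite.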
“This is an environment that is held together with bail wire and duct tape,” the former senior SSA technologist working in the office of the chief information officer tells WIRED. “The leaders need to understand that they’re dealing with a house of cards or Jenga. If they start pulling pieces out, which they’ve already stated they’re doing, things can break.”
As Musk brings his staff to the Office of Personnel Management, senior officials' access to data systems is being revoked. "We have no visibility into what they are doing with the computer and data systems," one of the officials said. "That is creating great concern. There is no oversight. It creates real cybersecurity and hacking implications." One key database, the officials cited, is Enterprise Human Resources Integration, which has "all of the birthdates, Social Security numbers, appraisals, home addresses, pay grades and lengths of service of government workers." Reuters spoke with University of Michigan Professor Don Moynihan, at the Ford School of Public Policy, who warned that there doesn't seem to be any congressional oversight over Trump and Musk. "This makes it much harder for anyone outside Musk's inner circle at OPM to know what's going on," Moynihan said. The officials said they still have the power to log on and access their emails but there's no access to the massive datasets they managed. Musk demands that his team work overnight and 80-hour weeks to find all of the necessary cuts, violating federal labor laws unless a worker is paid overtime. However, it's unclear whether Musk or American taxpayers are paying those workers. Musk had sofa beds brought into the OPM office on Jan. 20, the day Trump took office, to ensure his personal team could work non-stop. The area can only be accessed with a special security badge or a security escort, an OPM employee said.
'Great concern': Musk aides reportedly lock career civil servants out of computer systems - Raw Story
Musk aides lock government workers out of computer systems at US agency, sources say
Aides to Elon Musk charged with running the U.S. government human resources agency have locked career civil servants out of computer systems that contain the personal data of millions of federal employees, according to two agency officials. [...] The two officials, who spoke to Reuters on condition of anonymity for fear of retaliation, said some senior career employees at OPM have had their access revoked to some of the department's data systems. The systems include a vast database called Enterprise Human Resources Integration, which contains dates of birth, Social Security numbers, appraisals, home addresses, pay grades and length of service of government workers, the officials said. "We have no visibility into what they are doing with the computer and data systems," one of the officials said. "That is creating great concern. There is no oversight. It creates real cybersecurity and hacking implications." [...] A team including current and former employees of Musk assumed command of OPM on Jan. 20, the day Trump took office. They have moved sofa beds onto the fifth floor of the agency's headquarters, which contains the director's office and can only be accessed with a security badge or a security escort, one of the OPM employees said. The sofa beds have been installed so the team can work around the clock, the employee said. "It feels like a hostile takeover," the employee said.
this can only be solved with coercion at this point
Elon Musk's staff has been caught plugging in hard drives inside the Office of Personnel Management (OPM), the Treasury Department, and the General Services Administration (GSA). His staff encountered resistance when demanding that Treasury officials grant access to systems managing the flow of more than $6 trillion annually to programs like Social Security and Medicare. Tensions escalated when Musk’s aides were discovered at OPM accessing systems, including a vast database known as the Enterprise Human Resources Integration (EHRI), which contains sensitive information such as dates of birth, Social Security numbers, performance appraisals, home addresses, pay grades, and length of service for government employees. In response to employees speaking out, Musk’s aides locked civil servants out of computer systems and offices, with reports of personal items being searched.
Automatically clean up data sets to prevent ‘garbage in, garbage out’ - Technology Org
‘Garbage in, garbage out’ has become a byword for the idea that flawed input data will lead to flawed output data. Practical examples abound. If a dataset contains temperature readings in Celsius and Fahrenheit without proper conversion, any analysis based on that data will be flawed. If the input data for a gift recommender system contains errors in the age attribute of customers, it might accidentally suggest kids’ toys to grown-ups.
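To make the temperature example concrete, here is a tiny Python sketch (all values invented) showing how a column that silently mixes Celsius and Fahrenheit readings corrupts even a simple average:

```python
# A tiny illustration of 'garbage in, garbage out' (values invented):
# a column that silently mixes Celsius and Fahrenheit readings produces
# a meaningless summary statistic.

celsius = [20.0, 22.0, 21.0]   # all readings in Celsius
mixed   = [20.0, 71.6, 21.0]   # 22 C accidentally recorded as 71.6 F

def mean(xs):
    return sum(xs) / len(xs)

print(round(mean(celsius), 1))  # 21.0
print(round(mean(mixed), 1))    # 37.5 -- a heat wave that never happened
```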
Illustration by Heyerlein via Unsplash, free license
At a time when more and more companies, organizations and governments base decisions on data analytics, it is highly important to ensure good and clean data sets. That is what Sebastian Schelter and his colleagues are working on. Schelter is assistant professor at the Informatics Institute of the University of Amsterdam, working in the Intelligent Data Engineering Lab (INDElab). Academic work he published in 2018, when he was working at Amazon, presently powers some of Amazon’s data quality services. At UvA he is expanding that work.
What are the biggest problems with data sets?
‘Missing data is one big problem. Think of an Excel sheet where you have to fill in values in each cell, but some cells are empty. Maybe the data got lost, maybe it just wasn’t collected. That’s a very common problem. The second big problem is that some data are simply wrong. Let’s say you have data about the age of people and there appears to be somebody who is a thousand years old.
A third major problem with data sets is data integration errors, which arise from combining different data sets. Very often this leads to duplicates. Think of two companies that merge. They will each have address databases, and maybe the same address is spelled in slightly different ways: one database uses ‘street’ and the other uses ‘st.’, or the spelling differs in some other way.
Finally, the fourth major problem is called ‘referential integrity’. If you have datasets that reference each other, you need to make sure that the referencing is done correctly. If a company has a dataset with billing data and a bank has a dataset with bank account numbers of their customers, you want a bank account number in the billing dataset to reference an existing bank account at that bank, otherwise it would reference something that does not exist. Often there are problems with references between two data sets.’
How does your research tackle these problems?
‘Data scientists spend a lot of their time cleaning up flawed data sets. The numbers vary, but surveys have shown that it’s up to eighty percent of their time. That’s a big waste of time and talent. To counter this, we have developed open source software, called Deequ. Instead of data scientists having to write a program that validates the data quality, they can just write down what their data should look like. For example, they can prescribe things like: ‘there shouldn’t be missing data in the column with social security numbers’ or ‘the values in the age-column shouldn’t be negative’. Then Deequ runs over the data in an efficient way and tells whether the test is passed or not. Often Deequ also shows the particular data records that violated the test.’
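Deequ itself is a Scala library that runs on Apache Spark, so the following is only a minimal pure-Python sketch of the idea the interview describes, not Deequ’s real API: constraints are declared up front, run over the data, and failing records are reported.

```python
# A minimal pure-Python sketch of declarative data-quality checks in the
# spirit of Deequ -- NOT the real Deequ API (Deequ runs on Apache Spark).
# You declare what the data should look like; the checks report failures.

def is_complete(column):
    """Constraint: no missing values in `column`."""
    def check(rows):
        bad = [r for r in rows if r.get(column) in (None, "")]
        return (f"{column} is complete", not bad, bad)
    return check

def is_non_negative(column):
    """Constraint: every present value in `column` is >= 0."""
    def check(rows):
        bad = [r for r in rows if r.get(column) is not None and r[column] < 0]
        return (f"{column} is non-negative", not bad, bad)
    return check

def run_checks(rows, checks):
    """Run each declared constraint; return (name, passed, offending rows)."""
    return [check(rows) for check in checks]

customers = [
    {"ssn": "123-45-6789", "age": 34},
    {"ssn": None, "age": 28},           # missing SSN -> completeness fails
    {"ssn": "987-65-4321", "age": -3},  # negative age -> range check fails
]

for name, passed, bad in run_checks(customers, [is_complete("ssn"),
                                                is_non_negative("age")]):
    print(name, "PASSED" if passed else f"FAILED ({len(bad)} records)")
```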
How is Deequ used in practice?
‘The original scientific paper was written when I was working at Amazon. Since then, the open source implementation of this work has become pretty popular for all kinds of applications in all kinds of domains. There is a Python version which has more than four million downloads per month. After I left Amazon, the company built two cloud services based on Deequ, one of them called AWS Glue Data Quality. Amazon’s cloud is the most used cloud in the world, so many companies that use it have access to our way of data cleaning.’
What is the current research you are doing to clean up data sets?
‘At the moment we are developing a way to measure data quality of streaming data in our ICAI-lab ‘AI for Retail’, cooperating with bol.com. Deequ was developed for data at rest, but many use cases have a continuous stream of data. The data might be too big to store, there might be privacy reasons for not storing them, or it might simply be too expensive to store the data. So, we built StreamDQ, which can run quality checks on streaming data. A big challenge is that you can’t spend much time on processing the data, otherwise everything will be slowed down too much. So, you can only do certain tests and sometimes you have to use approximations. We have a working prototype, and we are now evaluating it.’
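StreamDQ’s actual implementation isn’t described here, so as a hedge, the following toy Python sketch only illustrates the general idea the interview describes: a streaming check that keeps constant-size running state instead of storing the records themselves.

```python
# A toy sketch of a streaming data-quality metric in the spirit of StreamDQ
# (not its real API): keep O(1) running state per check instead of storing
# the stream, and read the metric off on demand.

class StreamingCompleteness:
    """Running fraction of records with a non-missing value in `column`."""
    def __init__(self, column):
        self.column = column
        self.seen = 0
        self.present = 0

    def update(self, record):
        # O(1) per record; the record itself is never stored
        self.seen += 1
        if record.get(self.column) not in (None, ""):
            self.present += 1

    def value(self):
        return self.present / self.seen if self.seen else 1.0

check = StreamingCompleteness("ssn")
for record in [{"ssn": "a"}, {"ssn": None}, {"ssn": "b"}, {"ssn": "c"}]:
    check.update(record)
print(check.value())  # 0.75
```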
Source: University of Amsterdam
The Story of KLogs: What happens when a Mechanical Engineer codes
Since i no longer work at Wearhouse Automation Startup (WAS for short) and haven't for many years, i feel as though i should recount the tale of the most bonkers program i ever wrote. but first we need to establish some background
WAS has its HQ very far away from the big customer site and i worked as a Field Service Engineer (FSE) on site. so i learned early on that if a problem needed to be solved fast, WE had to do it. we never got many updates on what was coming down the pipeline for us or what issues were being worked on. this made us very independent
As such, we got good at reading the robot logs ourselves. it took too much time to send the logs off to HQ for analysis and get back what the problem was. we can read. now GETTING the logs is another thing.
the early robots we cut our teeth on used 2.4 GHz wifi to communicate with FSE's so dumping the logs was as simple as pushing a button in a little application and it would spit out a txt file
later on our robots were upgraded to use a 2.4 MHz xbee radio to communicate with us. which was FUCKING SLOW. and log dumping became a much more tedious process. you had to connect, go to logging mode, and then the robot would vomit all the logs in the past 2 min OR the entirety of its memory bank (only 2 options) into a terminal window. you would then save the terminal window and open it in a text editor to read them. it could take up to 5 min to dump the entire log file and if you didnt dump fast enough, the ACK messages from the control server would fill up the logs and erase the error as the memory overwrote itself.
this missing logs problem was a Big Deal for software who now weren't getting every log from every error so a NEW method of saving logs was devised: the robot would just vomit the log data in real time over a DIFFERENT radio and we would save it to a KQL server. Thanks Daddy Microsoft.
now whats KQL you may be asking. why, its Microsofts very own SQL clone! its Kusto Query Language. never mind that the system uses a SQL database for daily operations. lets use this proprietary Microsoft thing because they are paying us
so yay, problem solved. we now never miss the logs. so how do we read them if they are split up line by line in a database? why with a query of course!
select * from tbLogs where RobotUID = [64CharLongString] and timestamp > [UnixTimeCode]
if this makes no sense to you, CONGRATULATIONS! you found the problem with this setup. Most FSE's were BAD at SQL which meant they didnt read logs anymore. If you do understand what the query is, CONGRATULATIONS! you see why this is Very Stupid.
You could not search by robot name. each robot had some arbitrarily assigned 64 character long string as an identifier and the timestamps were not set to local time. so you had to run a lookup query to find the right name and do some time zone math to figure out what part of the logs to read. oh yeah and you had to download KQL to view them. so now we had both SQL and KQL on our computers
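to picture the hassle, here's a rough python sketch of that manual two-step dance (the table name, robot name, and UID here are all made up; the real UID was 64 characters):

```python
# A rough sketch of the manual lookup-then-timezone-math dance: resolve a
# human-readable robot name to its UID, then convert local wall-clock time
# to the Unix timestamp the logs are keyed on. All names here are invented.

from datetime import datetime, timezone, timedelta

robots = {"robot-07": "a3f9"}  # name -> UID (stand-in for the 64-char string)

def to_unix(local_time_str, utc_offset_hours):
    """'2023-05-01 14:30' in a site's local time -> Unix timestamp."""
    local = datetime.strptime(local_time_str, "%Y-%m-%d %H:%M")
    tz = timezone(timedelta(hours=utc_offset_hours))
    return int(local.replace(tzinfo=tz).timestamp())

uid = robots["robot-07"]
ts = to_unix("2023-05-01 14:30", -5)  # site on UTC-5
query = f"select * from tbLogs where RobotUID = '{uid}' and timestamp > {ts}"
print(query)
```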
NOBODY in the field liked this.
But Daddy Microsoft comes to the rescue
see we didnt JUST get KQL as part of that deal. we got the entire Microsoft cloud suite. and some people (like me) had been automating emails and stuff with Power Automate
This is Microsoft Power Automate. its Microsoft's version of Scratch but it has hooks into everything Microsoft. SharePoint, Teams, Outlook, Excel, it can integrate with all of it. i had been using it to send an email once a day with a list of all the robots in maintenance.
this gave me an idea
and i checked
and Power Automate had hooks for KQL
KLogs is actually short for Kusto Logs
I did not know how to program in Power Automate but damn it, anything is better than writing KQL queries. so i got to work. and about 2 months later i had a BEHEMOTH of a Power Automate program. it lagged the webpage and many times when i tried to edit something my changes wouldn't take and i would have to click in very specific ways to ensure none of my variables were getting nuked. i dont think this was the intended purpose of Power Automate but this is what it did
the KLogger would watch a list of Teams chats and when someone typed "klogs" or pasted a copy of an ERROR message, it would spring into action.
it extracted the robot name from the message and timestamp from teams
it would look up the name in the database to find the 64 long string UID and the location that robot was assigned to
it would reply to the message in teams saying it found a robot name and was getting logs
it would run a KQL query for the database and get the control system logs then export them into a CSV
it would save the CSV with a .xls extension into a folder in SharePoint (it would make a new folder for each day and location if it didnt have one already)
it would send ANOTHER message in teams with a LINK to the file in SharePoint
it would then enter a loop and scour the robot logs looking for the keyword ESTOP to find the error. (it did this because Kusto was SLOWER than the xbee radio and had up to a 10 min delay on syncing)
if it found the error, it would adjust its start and end timestamps to capture it and export the robot logs book-ended from the event by ~ 1 min. if it didnt, it would use the timestamp from when it was triggered +/- 5 min
it saved THOSE logs to SharePoint the same way as before
it would send ANOTHER message in teams with a link to the files
it would then check if the error was 1 of 3 very specific types of error with the camera. if it was, it would extract the base64 jpg image saved in KQL as a byte array, do the math to convert it, and save that as a jpg in SharePoint (and link it of course)
and then it would terminate. and if it encountered an error anywhere in all of this, i had logic where it would spit back an error message in Teams as plaintext explaining what step failed and the program would close gracefully
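the ESTOP windowing step above boils down to something like this little python sketch (the real logic lived in Power Automate, and the log format here is invented):

```python
# Sketch of the log-windowing logic described above: scan the robot logs
# for an ESTOP entry and book-end it by ~1 minute; if no ESTOP is found,
# fall back to +/- 5 minutes around the trigger time. Log format invented.

def log_window(logs, trigger_ts):
    """logs: list of (unix_ts, message). Returns (start_ts, end_ts)."""
    estop_ts = next((ts for ts, msg in logs if "ESTOP" in msg), None)
    if estop_ts is not None:
        return estop_ts - 60, estop_ts + 60    # book-end the error by ~1 min
    return trigger_ts - 300, trigger_ts + 300  # fallback: +/- 5 min

logs = [(1000, "heartbeat"), (1042, "ESTOP triggered"), (1050, "reset")]
print(log_window(logs, 1100))  # (982, 1102)
```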
I deployed it without asking anyone at one of the sites that was struggling. i just pointed it at their chat and turned it on. it had a bit of a rocky start (spammed chat) but man did the FSE's LOVE IT.
about 6 months later software deployed their answer to reading the logs: a webpage that acted as a nice GUI to the KQL database. much better than a CSV file
it still needed you to scroll through a big drop-down of robot names and enter a timestamp, but i noticed something. all that did was just change part of the URL and refresh the webpage
SO I MADE KLOGS 2 AND HAD IT GENERATE THE URL FOR YOU AND REPLY TO YOUR MESSAGE WITH IT. (it also still did the control server and jpg stuff). Theres a non-zero chance that klogs was still in use long after i left that job
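the KLOGS 2 trick in miniature looks something like this (the URL shape is completely made up, since the real page was internal):

```python
# Sketch of the KLOGS 2 trick: the log viewer encoded the robot UID and
# timestamps in its URL, so just build the URL and reply with it instead
# of clicking through the GUI. The URL shape here is invented.

from urllib.parse import urlencode

def viewer_url(uid, start_ts, end_ts):
    params = urlencode({"robot": uid, "from": start_ts, "to": end_ts})
    return f"https://logs.example.internal/view?{params}"

print(viewer_url("a3f9", 982, 1102))
# https://logs.example.internal/view?robot=a3f9&from=982&to=1102
```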
now i dont recommend anyone use power automate like this. its clunky and weird. i had to make a variable called "Carrage Return" which was a blank text box that i pressed enter one time in because it was incapable of understanding \n or generating a new line in any capacity OTHER than this (thanks support forum).
im also sure this probably is giving the actual programmer people anxiety. imagine working at a company and then some rando you've never seen but only heard about as "the FSE whos really good at root causing stuff", in a department that does not do any coding, managed to, in their spare time, build and release an entire workflow piggybacking on your work without any oversight, code review, or permission.....and everyone liked it
#comet tales#lazee works#power automate#coding#software engineering#it was so funny whenever i visited HQ because i would go “hi my name is LazeeComet” and they would go “OH i've heard SO much about you”